
FleaBay: eCommerce OSINT & Network Analysis
September 1, 2019

Tyler Seymour
tylerseymour@protonmail.com
https://tylerseymour.pw

Introduction

Nutshell

FleaBay is an OSINT workflow for eCommerce network analysis. It maps buyers to sellers in online marketplaces by analyzing publicly available feedback and reviews. The workflow constructs a network graph centered on one or more usernames, allowing investigators to track several types of fraud and criminal activity, including money laundering, fake reviews, suspension circumvention, and cash-out routes. Specifically, FleaBay allows investigators to:

  1. Associate usernames with purchases, sales, and online retailers.
  2. Produce interactive visualizations to explore eCommerce retail networks.
  3. Find potential paths for money flow between any two usernames.

How it Works

  1. FleaBay collects reviews left by a single user and finds all of the usernames that are "1-hop" from the target. A hop refers to the distance between the target (reviewer) and the destination (buyer or seller) in a network. As used here, 1-hop usernames are users that were directly reviewed by the target.

  2. The tool expands the network to "2-hops" by collecting reviews for each unique 1-hop username. In this case, the path from the target user to the destination is Target --> Intermediary (1-hop) --> Destination (2-hop).

  3. FleaBay constructs a network graph comprising usernames (nodes) and transactions (edges). This graph represents the eCommerce network.

  4. FleaBay finds potential paths for money to flow between two users, even when the flow passes through intermediary or proxy accounts. The path is calculated by finding the shortest path between the two nodes in the network using Dijkstra's algorithm.
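The steps above can be sketched with NetworkX. The usernames and edges below are purely illustrative, not real data; the point is that a shortest-path query surfaces a candidate money-flow route even through proxy accounts.

```python
import networkx as nx

g = nx.Graph()
# Each edge represents "reviewer left feedback for counterparty".
g.add_edges_from([
    ("target", "proxy_a"),
    ("proxy_a", "mule_1"),
    ("target", "proxy_b"),
    ("proxy_b", "mule_1"),
    ("mule_1", "cashout"),
])

# Dijkstra's algorithm (unit edge weights here) returns a fewest-hop
# route between two accounts, passing through any intermediaries.
path = nx.dijkstra_path(g, "target", "cashout")
print(path)  # e.g. ['target', 'proxy_a', 'mule_1', 'cashout']
```

With unweighted edges this is equivalent to a breadth-first shortest path; weights (e.g., transaction counts) could be added to the edges to prefer higher-volume routes.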

Supported Marketplaces and Forums

FleaBay currently supports only eBay transactions. Support for Amazon and other eCommerce marketplaces is planned.

Setup

Import Dependencies

In [2]:
#################
# Basic Imports #
#################
from __future__ import print_function, division
import pandas as pd
import networkx as nx
import numpy as np
import requests
import os
import os.path
from os import path
import zipfile
import glob

###########################
# Visualization Libraries #
###########################
from pyvis import network as net
import matplotlib as mpl
import matplotlib.pyplot as plt
%matplotlib inline

#################
# Hide Warnings #
#################
import warnings
warnings.simplefilter('ignore')

##########
# Output #
##########
print()
print("Imports Complete. ")
Imports Complete. 

Import DataFrame

This function allows you to import a previously saved DataFrame and pick up where you left off. To import an existing DataFrame, type the filename in the input box. For example, try "janedoe_1hop" or "janedoe_2hop" (without quotes) to load some previously collected data. To create a new DataFrame and move on to the next step, press "Enter" without entering any data.
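The checkpoint mechanism is an ordinary pandas pickle round-trip: save with `to_pickle`, restore with `read_pickle`. A minimal sketch with illustrative data (the filename and rows are made up):

```python
import pandas as pd

df = pd.DataFrame({
    "feedback": ["Exactly as described...Thank you!"],
    "origin": ["janedoe"],
    "username": ["time-shack"],
})

df.to_pickle("demo_1hop.pkl")               # checkpoint to disk
restored = pd.read_pickle("demo_1hop.pkl")  # pick up where you left off
print(restored.shape)  # (1, 3)
```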

In [3]:
print()
while True:
    try:
        pickle = input("Select a DataFrame, or Enter to Continue: \n\n")
        if pickle == '':
            print("No DataFrame Selected. Moving on to create a new network graph.")
            break
        else:
            pickleName = ("./flea-exports/" + pickle + ".pkl")
            df = pd.read_pickle(pickleName)  # pandas pickle, not a graph pickle
            print()
            print("DataFrame Import Complete.")
            print(df.shape)
            display(df.head())
            break
    except FileNotFoundError:
        print()
        print("Error. Try a different file.")
Select a DataFrame, or Enter to Continue: 


No DataFrame Selected. Moving on to create a new network graph.

Username Data Collection

Seed Username Information

Before starting the scrape, the first step is to calculate the total number of reviews for the starting "seed" user. A username may have anywhere from zero reviews to millions. FleaBay uses the review count to determine how many pages of reviews, at 200 reviews per page, need to be scraped.
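The page arithmetic is a ceiling division. A small sketch (the function name is mine, not the notebook's; it matches the notebook's `(numReviews // 200) + 1` figure except when the count is an exact multiple of 200, where the notebook would schedule one extra, empty page):

```python
def pages_to_scrape(num_reviews, per_page=200):
    """Number of 200-review pages needed to cover num_reviews."""
    return max(1, -(-num_reviews // per_page))  # ceiling division

print(pages_to_scrape(152))    # 1
print(pages_to_scrape(321))    # 2
print(pages_to_scrape(13149))  # 66
```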

In [4]:
#####################
# Set Seed Username #
#####################
print()
username = input("Enter a single username to investigate: ")
print("Username to collect: " + username)
print()

###################
# User Input Loop #
###################
while True:
    try:
        ########################
        # Calculate Statistics #
        ########################
        html = ('https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=' + username + '&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1')
        tempdf = pd.read_html(html, header=0)
        tempdf = tempdf[14].copy(deep=False) #Frame 14 contains the number of reviews statistics
        fcount = tempdf.columns.get_values()[0]
        flist = fcount.split(' ')
        flist[0] = flist[0].replace(',', '')
        numReviews = int(flist[0])
        pages = (numReviews // 200) + 2
        num = pages-1
        
        ######################
        # Output Information #
        ######################  
        print()
        print("The username " + username + " has " + str(numReviews) + " reviews.")
        print("At 200 reviews per page, there are " + str(num) + " pages that need to be scraped.")
        print("For more information, view the " + username + " feedback page at: ")
        print(html)
        break
        
    ###################################
    # Handle Username Not Found Error #
    ###################################
    except IndexError:
        print()
        username = input("Error. Try a different username: ")

        
Enter a single username to investigate: janedoe
Username to collect: janedoe


The username janedoe has 152 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the janedoe feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=janedoe&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1

Scrape 1-Hop Reviews

In [5]:
##################################
# Scrape Seed Username's Reviews #
##################################

df = pd.DataFrame(columns=['Unnamed: 0', 'Feedback', 'Left for', 'When', 'Unnamed: 4', 'origin'])
for pageNumber in range(1, pages):
    html = ('https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=' + username + '&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=' + str(pageNumber))
    dftemp = pd.read_html(html, header=0)
    dftemp = dftemp[15].copy(deep=False) #Frame 15 contains the reviews
    dftemp['origin'] = username
    frames = [df, dftemp]
    df = pd.concat(frames)
    print("Scraping page No. " + str(pageNumber))

######################
# Output Information #
######################
print(df.shape)
display(df.head())
Scraping page No. 1
(308, 6)
Unnamed: 0 Feedback Left for When Unnamed: 4 origin
0 NaN Quick payment, easy transaction...Thank you! Buyer: e***e () During past month NaN janedoe
1 NaN -- -- Private NaN janedoe
2 NaN Fantastic buyer, prompt payment, so easy, than... Buyer: k***t () During past 6 months NaN janedoe
3 NaN -- -- Private NaN janedoe
4 infoDetailed item information is not available... infoDetailed item information is not available... infoDetailed item information is not available... infoDetailed item information is not available... infoDetailed item information is not available... janedoe

Cleanup the Data

In [6]:
df = df.drop(['Unnamed: 0'], axis=1)
df = df.drop(['Unnamed: 4'], axis=1)
df = df.dropna(inplace=False)
df = df[~df["Feedback"].str.contains('--')]
df = df[~df["Feedback"].str.contains('infoDetailed')]
df = df[~df["Feedback"].str.contains('Reply by')]
df.columns = ['feedback', 'type', 'when', 'origin']
df = df[~df["type"].str.contains('Buyer:')]
df = df[~df["type"].str.contains('--')]
df.reset_index(inplace=True, drop=True)
df[['type', 'username']] = df['type'].str.split(': Member ID ', expand=True)
df = df.dropna()
df['username'] = df['username'].apply(lambda x: x.split(' ')[0])
df.reset_index(inplace=True, drop=True)

print()
print("The shape of the dataframe after the first pass of cleaning is: ")
print(df.shape)
display(df.head())
The shape of the dataframe after the first pass of cleaning is: 
(77, 5)
feedback type when origin username
0 Exactly as described...Thank you! Seller During past year janedoe time-shack
1 Exactly as described, looks brand new. Thank you! Seller During past year janedoe terazwroclaw
2 Exactly as described...Thank you! Seller During past year janedoe timespotstore
3 Exactly as described...Thank you! Seller During past year janedoe dwi-international-8
4 Item exactly as described and shipped quickly.... Seller More than a year ago janedoe rnxllc

Graph 1-Hop Transform from Seed

In [7]:
plt.figure(figsize=(12, 12))
g = nx.from_pandas_edgelist(df, source='origin', target='username') 
nx.draw(g, with_labels=True)

Export Graph Image

In [8]:
print()
imageInput = input("Save PNG Image As: ")
imageName = ("./flea-exports/" + imageInput + ".png")
print("Exported image to " + imageName)
plt.savefig(imageName, transparent=True, dpi=300)
Save PNG Image As: janedoe_1hop
Exported image to ./flea-exports/janedoe_1hop.png
<Figure size 432x288 with 0 Axes>

Checkpoint: Export Dataframe

In [9]:
pickle = input("Save Dataframe as: ")
pickleName = ("./flea-exports/" + pickle + ".pkl")
df.to_pickle(pickleName)
print()
print("Exported as " + pickleName)
Save Dataframe as: janedoe_1hop

Exported as ./flea-exports/janedoe_1hop.pkl

Multi-User Data Collection

Import Dataframe

In [10]:
while True:
    try:
        pickle = input("Import Dataframe (Enter to Exit) ")
        if pickle == "":
            break
        else:
            pickleName = ("./flea-exports/" + pickle + ".pkl")
            df = pd.read_pickle(pickleName)  # pandas pickle, not a graph pickle
            print("Import complete.")
            print(df.shape)
            display(df.head())
            break
    except FileNotFoundError:
        print()
        print("Error. Try a different file.")
Import Dataframe (Enter to Exit) janedoe_1hop
Import complete.
(77, 5)
feedback type when origin username
0 Exactly as described...Thank you! Seller During past year janedoe time-shack
1 Exactly as described, looks brand new. Thank you! Seller During past year janedoe terazwroclaw
2 Exactly as described...Thank you! Seller During past year janedoe timespotstore
3 Exactly as described...Thank you! Seller During past year janedoe dwi-international-8
4 Item exactly as described and shipped quickly.... Seller More than a year ago janedoe rnxllc

Expand 1-Hop Usernames from Seed

In [11]:
usernameList = list(df['username'].unique())
originList = list(df['origin'].unique())
deletedUsers = list([s for s in usernameList if "deleted" in s])
usernameList = [x for x in usernameList if x not in deletedUsers]
print()
print("This dataset includes " + str(len(usernameList)) + " unique accounts, excluding deleted user accounts.")
print()
for item in originList:
    if item not in usernameList:
        usernameList.append(item)  # add each missing origin, not just the first

print(usernameList)
This dataset includes 60 unique accounts, excluding deleted user accounts.

['time-shack', 'terazwroclaw', 'timespotstore', 'dwi-international-8', 'rnxllc', 'ourcedarclosets', 'listen2myreview', 'rosesboutiqueno2', 'sunsetsuppliers', 'rik-rew', 'poolsupplyworld', 'ivshc', 'carter-cam14', 'dreamjoey', 'trucestiles', 'davidv556', 'easyshopca', 'sunglass_super_zone', 'joemutt', 'eladn', 'rterryky', 'thebestcover', 'valuesmith', 'chicagocards', 'slqmaven', 'hi-autopia', 'gps-r-us', 'golfer60504', 'jkw4golf', 'martini721', '(smashing-pumpkins)', 'weekendtreasuregirl', 'percefullr', '1700gr', 'ljs9121', 'jbezazian05', 'threewands', 'golfpro777', 'golfgearandmore', 'suntechcases', 'tnm3996', 'bestdeals123abc', 'jaycsgolf', 'showtime57', 'niftiee', 'lindav1732', 'thetreasurejunkiemn', 'ceemor712', 'thegolfgodz', 'bigdshouseofgolf', 'blondeissilly', 'cgolf50', 'alpha913', 'pundsnachos', 'mybestbet', 'kgawthrope', '3ballsgolf', 'knetgolf', 'orlandoelectronic', 'azwehavealltheritestuff', 'janedoe']

Deleted Users

In [12]:
print()
print("Note that the " + str(len(deletedUsers)) + " deleted user accounts that " + username + " has reviewed may be probative of malign activity.")
print()
print(deletedUsers)
Note that the 5 deleted user accounts that janedoe has reviewed may be probative of malign activity.

['705288808@deleted', '971464246@deleted', '789717430@deleted', '644643290@deleted', '108353277@deleted']

Add More Usernames

In [13]:
print()
while True:
    userInput = input("Add Another Username (Enter to Exit) ")
    if userInput == "":
        break
    else:
        usernameList.append(userInput)
        print("Added username: " + userInput)

deletedUsers = list([s for s in usernameList if "deleted" in s])
usernameList = [x for x in usernameList if x not in deletedUsers]

print()
print("The following usernames are on the collection list: ")
print(usernameList)
Add Another Username (Enter to Exit) alpha913
Added username: alpha913
Add Another Username (Enter to Exit) 

The following usernames are on the collection list: 
['time-shack', 'terazwroclaw', 'timespotstore', 'dwi-international-8', 'rnxllc', 'ourcedarclosets', 'listen2myreview', 'rosesboutiqueno2', 'sunsetsuppliers', 'rik-rew', 'poolsupplyworld', 'ivshc', 'carter-cam14', 'dreamjoey', 'trucestiles', 'davidv556', 'easyshopca', 'sunglass_super_zone', 'joemutt', 'eladn', 'rterryky', 'thebestcover', 'valuesmith', 'chicagocards', 'slqmaven', 'hi-autopia', 'gps-r-us', 'golfer60504', 'jkw4golf', 'martini721', '(smashing-pumpkins)', 'weekendtreasuregirl', 'percefullr', '1700gr', 'ljs9121', 'jbezazian05', 'threewands', 'golfpro777', 'golfgearandmore', 'suntechcases', 'tnm3996', 'bestdeals123abc', 'jaycsgolf', 'showtime57', 'niftiee', 'lindav1732', 'thetreasurejunkiemn', 'ceemor712', 'thegolfgodz', 'bigdshouseofgolf', 'blondeissilly', 'cgolf50', 'alpha913', 'pundsnachos', 'mybestbet', 'kgawthrope', '3ballsgolf', 'knetgolf', 'orlandoelectronic', 'azwehavealltheritestuff', 'janedoe', 'alpha913']

Scrape 2-Hop Reviews

In [14]:
userInputPages = input("Maximum Pages Deep: ")

df = pd.DataFrame(columns=['Unnamed: 0', 'Feedback',
                           'Left for', 'When', 'Unnamed: 4', 'origin'])

for username in usernameList:
    try:
        pageNumber = 1
        html = ('https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=' +
                username + '&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=' + str(pageNumber))
        tempdf = pd.read_html(html, header=0)
        tempdf = tempdf[14].copy(deep=False)
        fcount = tempdf.columns.get_values()[0]
        flist = fcount.split(' ')
        flist[0] = flist[0].replace(',', '')
        numReviews = int(flist[0])
        pages = (numReviews // 200) + 2
        num = pages-1
        print()
        print("The username " + username + " has " +
              str(numReviews) + " reviews.")
        print("At 200 reviews per page, there are " +
              str(num) + " pages that need to be scraped.")
        print("For more information, view the " +
              username + " feedback page at: ")
        print(html)
    except (ValueError, IndexError):
        continue

    for page in range(1, pages):
        if page > int(userInputPages):
            break  # stop once the user-set page limit is reached
        else:
            html = ('https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=' +
                    username + '&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=' + str(page))
            dftemp = pd.read_html(html, header=0)
            dftemp = dftemp[15].copy(deep=False)
            dftemp['origin'] = username
            frames = [df, dftemp]
            df = pd.concat(frames)
            print("\tScraping page No. " + str(page) + " of " + str(userInputPages) + " | " + str(num) + " total pages.")

print(df.shape)
display(df.head())
Maximum Pages Deep: 1

The username time-shack has 13149 reviews.
At 200 reviews per page, there are 66 pages that need to be scraped.
For more information, view the time-shack feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=time-shack&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 66 total pages.

The username terazwroclaw has 321 reviews.
At 200 reviews per page, there are 2 pages that need to be scraped.
For more information, view the terazwroclaw feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=terazwroclaw&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 2 total pages.

The username timespotstore has 120 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the timespotstore feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=timespotstore&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username dwi-international-8 has 51648 reviews.
At 200 reviews per page, there are 259 pages that need to be scraped.
For more information, view the dwi-international-8 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=dwi-international-8&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 259 total pages.

The username rnxllc has 547 reviews.
At 200 reviews per page, there are 3 pages that need to be scraped.
For more information, view the rnxllc feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=rnxllc&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 3 total pages.

The username ourcedarclosets has 13007 reviews.
At 200 reviews per page, there are 66 pages that need to be scraped.
For more information, view the ourcedarclosets feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=ourcedarclosets&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 66 total pages.

The username listen2myreview has 27 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the listen2myreview feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=listen2myreview&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username rosesboutiqueno2 has 14636 reviews.
At 200 reviews per page, there are 74 pages that need to be scraped.
For more information, view the rosesboutiqueno2 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=rosesboutiqueno2&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 74 total pages.

The username sunsetsuppliers has 597 reviews.
At 200 reviews per page, there are 3 pages that need to be scraped.
For more information, view the sunsetsuppliers feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=sunsetsuppliers&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 3 total pages.

The username rik-rew has 11770 reviews.
At 200 reviews per page, there are 59 pages that need to be scraped.
For more information, view the rik-rew feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=rik-rew&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 59 total pages.

The username poolsupplyworld has 338237 reviews.
At 200 reviews per page, there are 1692 pages that need to be scraped.
For more information, view the poolsupplyworld feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=poolsupplyworld&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1692 total pages.

The username ivshc has 42320 reviews.
At 200 reviews per page, there are 212 pages that need to be scraped.
For more information, view the ivshc feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=ivshc&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 212 total pages.

The username carter-cam14 has 852 reviews.
At 200 reviews per page, there are 5 pages that need to be scraped.
For more information, view the carter-cam14 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=carter-cam14&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 5 total pages.

The username dreamjoey has 9460 reviews.
At 200 reviews per page, there are 48 pages that need to be scraped.
For more information, view the dreamjoey feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=dreamjoey&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 48 total pages.

The username trucestiles has 3480 reviews.
At 200 reviews per page, there are 18 pages that need to be scraped.
For more information, view the trucestiles feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=trucestiles&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 18 total pages.

The username davidv556 has 9449 reviews.
At 200 reviews per page, there are 48 pages that need to be scraped.
For more information, view the davidv556 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=davidv556&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 48 total pages.

The username easyshopca has 84067 reviews.
At 200 reviews per page, there are 421 pages that need to be scraped.
For more information, view the easyshopca feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=easyshopca&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 421 total pages.

The username sunglass_super_zone has 19620 reviews.
At 200 reviews per page, there are 99 pages that need to be scraped.
For more information, view the sunglass_super_zone feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=sunglass_super_zone&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 99 total pages.

The username joemutt has 9427 reviews.
At 200 reviews per page, there are 48 pages that need to be scraped.
For more information, view the joemutt feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=joemutt&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 48 total pages.

The username eladn has 48 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the eladn feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=eladn&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username rterryky has 4478 reviews.
At 200 reviews per page, there are 23 pages that need to be scraped.
For more information, view the rterryky feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=rterryky&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 23 total pages.

The username thebestcover has 23365 reviews.
At 200 reviews per page, there are 117 pages that need to be scraped.
For more information, view the thebestcover feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=thebestcover&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 117 total pages.

The username valuesmith has 127152 reviews.
At 200 reviews per page, there are 636 pages that need to be scraped.
For more information, view the valuesmith feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=valuesmith&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 636 total pages.

The username chicagocards has 43448 reviews.
At 200 reviews per page, there are 218 pages that need to be scraped.
For more information, view the chicagocards feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=chicagocards&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 218 total pages.

The username slqmaven has 180555 reviews.
At 200 reviews per page, there are 903 pages that need to be scraped.
For more information, view the slqmaven feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=slqmaven&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 903 total pages.

The username hi-autopia has 544993 reviews.
At 200 reviews per page, there are 2725 pages that need to be scraped.
For more information, view the hi-autopia feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=hi-autopia&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 2725 total pages.

The username gps-r-us has 14640 reviews.
At 200 reviews per page, there are 74 pages that need to be scraped.
For more information, view the gps-r-us feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=gps-r-us&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 74 total pages.

The username golfer60504 has 13461 reviews.
At 200 reviews per page, there are 68 pages that need to be scraped.
For more information, view the golfer60504 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=golfer60504&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 68 total pages.

The username jkw4golf has 989 reviews.
At 200 reviews per page, there are 5 pages that need to be scraped.
For more information, view the jkw4golf feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=jkw4golf&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 5 total pages.

The username martini721 has 10257 reviews.
At 200 reviews per page, there are 52 pages that need to be scraped.
For more information, view the martini721 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=martini721&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 52 total pages.

The username (smashing-pumpkins) has 615 reviews.
At 200 reviews per page, there are 4 pages that need to be scraped.
For more information, view the (smashing-pumpkins) feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=(smashing-pumpkins)&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 4 total pages.

The username weekendtreasuregirl has 1771 reviews.
At 200 reviews per page, there are 9 pages that need to be scraped.
For more information, view the weekendtreasuregirl feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=weekendtreasuregirl&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 9 total pages.

The username percefullr has 41438 reviews.
At 200 reviews per page, there are 208 pages that need to be scraped.
For more information, view the percefullr feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=percefullr&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 208 total pages.

The username 1700gr has 3440 reviews.
At 200 reviews per page, there are 18 pages that need to be scraped.
For more information, view the 1700gr feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=1700gr&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 18 total pages.

The username ljs9121 has 108 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the ljs9121 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=ljs9121&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username jbezazian05 has 921 reviews.
At 200 reviews per page, there are 5 pages that need to be scraped.
For more information, view the jbezazian05 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=jbezazian05&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 5 total pages.

The username threewands has 318 reviews.
At 200 reviews per page, there are 2 pages that need to be scraped.
For more information, view the threewands feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=threewands&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 2 total pages.

The username golfpro777 has 2462 reviews.
At 200 reviews per page, there are 13 pages that need to be scraped.
For more information, view the golfpro777 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=golfpro777&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 13 total pages.

The username golfgearandmore has 6501 reviews.
At 200 reviews per page, there are 33 pages that need to be scraped.
For more information, view the golfgearandmore feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=golfgearandmore&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 33 total pages.

The username tnm3996 has 8788 reviews.
At 200 reviews per page, there are 44 pages that need to be scraped.
For more information, view the tnm3996 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=tnm3996&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 44 total pages.

The username bestdeals123abc has 148 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the bestdeals123abc feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=bestdeals123abc&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username jaycsgolf has 61690 reviews.
At 200 reviews per page, there are 309 pages that need to be scraped.
For more information, view the jaycsgolf feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=jaycsgolf&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 309 total pages.

The username showtime57 has 2048 reviews.
At 200 reviews per page, there are 11 pages that need to be scraped.
For more information, view the showtime57 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=showtime57&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 11 total pages.

The username niftiee has 292 reviews.
At 200 reviews per page, there are 2 pages that need to be scraped.
For more information, view the niftiee feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=niftiee&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 2 total pages.

The username lindav1732 has 5261 reviews.
At 200 reviews per page, there are 27 pages that need to be scraped.
For more information, view the lindav1732 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=lindav1732&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 27 total pages.

The username thetreasurejunkiemn has 965 reviews.
At 200 reviews per page, there are 5 pages that need to be scraped.
For more information, view the thetreasurejunkiemn feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=thetreasurejunkiemn&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 5 total pages.

The username ceemor712 has 12376 reviews.
At 200 reviews per page, there are 62 pages that need to be scraped.
For more information, view the ceemor712 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=ceemor712&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 62 total pages.

The username thegolfgodz has 2674 reviews.
At 200 reviews per page, there are 14 pages that need to be scraped.
For more information, view the thegolfgodz feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=thegolfgodz&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 14 total pages.

The username bigdshouseofgolf has 19941 reviews.
At 200 reviews per page, there are 100 pages that need to be scraped.
For more information, view the bigdshouseofgolf feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=bigdshouseofgolf&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 100 total pages.

The username blondeissilly has 377 reviews.
At 200 reviews per page, there are 2 pages that need to be scraped.
For more information, view the blondeissilly feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=blondeissilly&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 2 total pages.

The username cgolf50 has 2314 reviews.
At 200 reviews per page, there are 12 pages that need to be scraped.
For more information, view the cgolf50 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=cgolf50&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 12 total pages.

The username alpha913 has 142 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the alpha913 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=alpha913&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username pundsnachos has 87 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the pundsnachos feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=pundsnachos&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username mybestbet has 5871 reviews.
At 200 reviews per page, there are 30 pages that need to be scraped.
For more information, view the mybestbet feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=mybestbet&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 30 total pages.

The username kgawthrope has 3458 reviews.
At 200 reviews per page, there are 18 pages that need to be scraped.
For more information, view the kgawthrope feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=kgawthrope&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 18 total pages.

The username 3ballsgolf has 1469691 reviews.
At 200 reviews per page, there are 7349 pages that need to be scraped.
For more information, view the 3ballsgolf feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=3ballsgolf&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 7349 total pages.

The username knetgolf has 93122 reviews.
At 200 reviews per page, there are 466 pages that need to be scraped.
For more information, view the knetgolf feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=knetgolf&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 466 total pages.

The username orlandoelectronic has 11002 reviews.
At 200 reviews per page, there are 56 pages that need to be scraped.
For more information, view the orlandoelectronic feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=orlandoelectronic&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 56 total pages.

The username azwehavealltheritestuff has 462 reviews.
At 200 reviews per page, there are 3 pages that need to be scraped.
For more information, view the azwehavealltheritestuff feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=azwehavealltheritestuff&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 3 total pages.

The username janedoe has 152 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the janedoe feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=janedoe&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.

The username alpha913 has 142 reviews.
At 200 reviews per page, there are 1 pages that need to be scraped.
For more information, view the alpha913 feedback page at: 
https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=alpha913&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1
	Scraping page No. 1 of 1 | 1 total pages.
(22729, 6)
Feedback Left for Unnamed: 0 Unnamed: 4 When origin
0 Quick response and fast payment. Perfect! THAN... Buyer: g***l () NaN NaN During past month time-shack
1 Lot of 35 vintage watch mainsprings various si... -- NaN NaN View Item Lot of 35 vintage watch mainsprings ... time-shack
2 Quick response and fast payment. Perfect! THAN... Buyer: g***l () NaN NaN During past month time-shack
3 Lot of over 100 vintage watch mainsprings vari... -- NaN NaN View Item Lot of over 100 vintage watch mainsp... time-shack
4 Thank you for an easy, pleasant transaction. E... Buyer: m***r () NaN NaN During past month time-shack

Clean Up the Data

In [15]:
#######################
# Clean the DataFrame #
#######################
# Drop the scraper's empty helper columns
df = df.drop(['Unnamed: 0', 'Unnamed: 4'], axis=1)
df = df.dropna()
# Remove item rows, detail rows, and seller replies
df = df[~df["Feedback"].str.contains('--')]
df = df[~df["Feedback"].str.contains('infoDetailed')]
df = df[~df["Feedback"].str.contains('Reply by')]
df.columns = ['feedback', 'type', 'when', 'origin']
# Drop buyer-side rows, keeping feedback left for sellers
df = df[~df["type"].str.contains('Buyer:')]
df = df[~df["type"].str.contains('--')]
df.reset_index(inplace=True, drop=True)
# Split strings like "Seller: Member ID <name> ..." into role and username
df[['type', 'username']] = df['type'].str.split(': Member ID ', expand=True)
df = df.dropna()
df['username'] = df['username'].apply(lambda x: x.split(' ')[0])
df.reset_index(inplace=True, drop=True)

print()
print("The shape of the dataframe after the first pass of cleaning is: ")
print(df.shape)
display(df.head())
The shape of the dataframe after the first pass of cleaning is: 
(2056, 5)
feedback type when origin username
0 Thank U Seller During past 6 months terazwroclaw joecoolcigar
1 Thank U Seller During past 6 months terazwroclaw grafo-15
2 Thank U Seller During past 6 months terazwroclaw grafo-15
3 Thank U Seller During past 6 months terazwroclaw vivanista13
4 Thank U Seller During past 6 months terazwroclaw proozyoutlet
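
The heavy lifting in the cleanup cell is the `': Member ID '` split. A minimal sketch of that step on made-up rows (the raw strings below are assumptions modeled on eBay's feedback format, not scraped data):

```python
import pandas as pd

# Hypothetical rows mimicking the "Seller: Member ID <name> (<score>)" format
demo = pd.DataFrame({"type": [
    "Seller: Member ID joecoolcigar ( 1024 )",
    "Seller: Member ID grafo-15 ( 88 )",
]})

# Split the combined column into role and username, as in the cleanup cell above
demo[["type", "username"]] = demo["type"].str.split(": Member ID ", expand=True)
# The username is the first whitespace-delimited token; the trailing score is dropped
demo["username"] = demo["username"].apply(lambda x: x.split(" ")[0])

print(demo["username"].tolist())  # ['joecoolcigar', 'grafo-15']
```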

Checkpoint: Export DataFrame

In [16]:
pickle = input("Save Dataframe as: ")
pickleName = ("./flea-exports/" + pickle + ".pkl")
df.to_pickle(pickleName)
print()
print("Exported as " + pickleName)
Save Dataframe as: janedoe_2hop

Exported as ./flea-exports/janedoe_2hop.pkl
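
Because the checkpoint is a pandas pickle, it should be read back with `pd.read_pickle`. A minimal round-trip sketch, using a temporary directory and a made-up two-column frame as stand-ins:

```python
import os
import tempfile

import pandas as pd

# Stand-in frame in the same origin/username shape as the notebook's DataFrame
demo = pd.DataFrame({"origin": ["janedoe"], "username": ["time-shack"]})

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "janedoe_2hop.pkl")
    demo.to_pickle(path)              # write the checkpoint
    restored = pd.read_pickle(path)   # read it back with pandas

print(restored.equals(demo))  # True
```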

Network Visualization

Import DataFrame

In [17]:
while True:
    try:
        pickle = input("Import Dataframe (Enter to Exit): ")
        if pickle == '':
            break
        else:
            pickleName = ("./flea-exports/" + pickle + ".pkl")
            df = pd.read_pickle(pickleName)  # pandas pickle, so read with pandas (not nx.read_gpickle)
            print("Import complete.")
            print(df.shape)
            display(df.head())
            break
    except FileNotFoundError:
        print()
        print("Error. Try a different file.")
Import Dataframe (Enter to Exit): janedoe_2hop
Import complete.
(2056, 5)
feedback type when origin username
0 Thank U Seller During past 6 months terazwroclaw joecoolcigar
1 Thank U Seller During past 6 months terazwroclaw grafo-15
2 Thank U Seller During past 6 months terazwroclaw grafo-15
3 Thank U Seller During past 6 months terazwroclaw vivanista13
4 Thank U Seller During past 6 months terazwroclaw proozyoutlet

Network Graph: Spring Layout

In [18]:
print()
print("Warning: Large networks can take several minutes to generate. Be patient. ")
print()

while True:
    try:
        ########################
        # Set Layout to Spring #
        ########################
        plt.figure(figsize=(20, 20))
        g = nx.from_pandas_edgelist(df, source='origin', target='username')
        layout = nx.spring_layout(g, iterations=1000, k=1.25)

        ################
        # Lonely Nodes #
        ################
        sellerUsernames = list(df.username.unique())
        lonelyNodes = [review for review in sellerUsernames if g.degree(review) == 0]
        g0 = g.subgraph(lonelyNodes).copy()
        # Full graph drawn first as a small blue base layer; later groups overprint it
        nx.draw_networkx_nodes(g, layout, node_color='blue', node_size=4)
        nx.draw_networkx_edges(g, layout, width=1, edge_color="lightblue")

        ###################
        # Popular Sellers #
        ###################
        popularSellers = [review for review in sellerUsernames if g.degree(review) > 1]
        g1 = g.subgraph(popularSellers).copy()
        nx.draw_networkx_nodes(g1, layout, nodelist=popularSellers,
                               node_size=250, node_color='orange')
        nx.draw_networkx_edges(g1, layout, edge_color='orange', width=2)
        nodeLabelsPopularSellers = dict(zip(popularSellers, popularSellers))
#         nx.draw_networkx_labels(g1, layout, labels=nodeLabelsPopularSellers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')
        
        #########################
        # Originating Usernames #
        #########################
        sellerOrigin = list(df.origin.unique())
        sellerOriginSize = [g.degree(review) * 10 for review in sellerOrigin]
        g2 = g.subgraph(sellerOrigin).copy()
        nx.draw_networkx_nodes(g2, layout, nodelist=sellerOrigin,
                               node_size=sellerOriginSize, node_color='green')
        nx.draw_networkx_edges(g2, layout, edge_color='lightgreen', width=2)
        nodeLabelsOrigin = dict(zip(sellerOrigin, sellerOrigin))
        nx.draw_networkx_labels(g2, layout, labels=nodeLabelsOrigin, font_size=14,
                                font_color='k', font_family='sans-serif', font_weight='normal')
        
        #####################
        # Deleted Usernames #
        #####################
        usernameList = list(df['username'].unique())
        deletedUsers = list([s for s in usernameList if "deleted" in s])
        g3 = g.subgraph(deletedUsers).copy()
        nx.draw_networkx_nodes(g3, layout, nodelist=deletedUsers, node_size=50, node_color='red')
        nx.draw_networkx_edges(g3, layout, edge_color='pink', width=2)
        nodeLabelsDeletedUsers = dict(zip(deletedUsers, deletedUsers))
#         nx.draw_networkx_labels(g3, layout, labels=nodeLabelsDeletedUsers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')

        ########################
        # Display Plot Options #
        ########################
        plotTitle = "eBay Seller Network - Spring Layout"
        plt.axis('off')
        plt.title(plotTitle)
        saveLoc = ("./flea-exports/" + plotTitle)
        plt.savefig(saveLoc, transparent=True, dpi=300)  # save before show(); show() releases the figure
        plt.show()
        break
    
    except NodeNotFound:
        break  # nothing changes between attempts, so retrying would loop forever
Warning: Large networks can take several minutes to generate. Be patient. 

<Figure size 432x288 with 0 Axes>
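
The plotting cells classify nodes by degree: reviewed usernames with more than one connection become the large orange "popular sellers," while degree-1 usernames stay in the small blue base layer. A toy sketch of that classification (the usernames are invented):

```python
import networkx as nx
import pandas as pd

# Toy edge list in the same origin/username shape as the notebook's DataFrame
demo = pd.DataFrame({
    "origin":   ["a", "a", "b", "b"],
    "username": ["x", "y", "y", "z"],
})
toy = nx.from_pandas_edgelist(demo, source="origin", target="username")

# "Popular sellers" are reviewed usernames with degree > 1
# (multiple reviewers or repeat transactions)
sellers = demo["username"].unique()
popular = [u for u in sellers if toy.degree(u) > 1]

print(popular)  # ['y']
```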

Network Graph: Fruchterman-Reingold Layout

In [19]:
print()
print("Warning: Large networks can take several minutes to generate. Be patient. ")
print()
while True:
    try:
        ######################################
        # Set Layout to Fruchterman-Reingold #
        ######################################
        plt.figure(figsize=(20, 20))
        g = nx.from_pandas_edgelist(df, source='origin', target='username')
        layout = nx.fruchterman_reingold_layout(g, iterations=500, k=1)
        
        ################
        # Lonely Nodes #
        ################
        sellerUsernames = list(df.username.unique())        
        lonelyNodes = [review for review in sellerUsernames if g.degree(review) == 0]
        g0 = g.subgraph(lonelyNodes).copy()
        nx.draw_networkx_nodes(g, layout, node_color='blue', node_size=4)
        nx.draw_networkx_edges(g, layout, width=1, edge_color="lightblue")
        
        ###################
        # Popular Sellers #
        ###################
        popularSellers = [review for review in sellerUsernames if g.degree(review) > 1]
        g1 = g.subgraph(popularSellers).copy()
        nx.draw_networkx_nodes(g1, layout, nodelist=popularSellers,
                               node_size=250, node_color='orange')
        nx.draw_networkx_edges(g1, layout, edge_color='orange', width=2)
        nodeLabelsPopularSellers = dict(zip(popularSellers, popularSellers))
#         nx.draw_networkx_labels(g1, layout, labels=nodeLabelsPopularSellers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')

        #########################
        # Originating Usernames #
        #########################
        sellerOrigin = list(df.origin.unique())
        sellerOriginSize = [g.degree(review) * 10 for review in sellerOrigin]
        g2 = g.subgraph(sellerOrigin).copy()
        nx.draw_networkx_nodes(g2, layout, nodelist=sellerOrigin,
                               node_size=sellerOriginSize, node_color='green')
        nx.draw_networkx_edges(g2, layout, edge_color='lightgreen', width=2)
        nodeLabelsOrigin = dict(zip(sellerOrigin, sellerOrigin))
        nx.draw_networkx_labels(g2, layout, labels=nodeLabelsOrigin, font_size=14,
                                font_color='k', font_family='sans-serif', font_weight='normal')
        #####################
        # Deleted Usernames #
        #####################        
        usernameList = list(df['username'].unique())
        deletedUsers = list([s for s in usernameList if "deleted" in s])
        g3 = g.subgraph(deletedUsers).copy()
        nx.draw_networkx_nodes(g3, layout, nodelist=deletedUsers, node_size=50, node_color='red')
        nx.draw_networkx_edges(g3, layout, edge_color='pink', width=2)
        nodeLabelsDeletedUsers = dict(zip(deletedUsers, deletedUsers))
#         nx.draw_networkx_labels(g3, layout, labels=nodeLabelsDeletedUsers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')

        ########################
        # Display Plot Options #
        ########################
        plotTitle = "eBay Seller Network - Fruchterman Layout"
        plt.axis('off')
        plt.title(plotTitle)
        saveLoc = ("./flea-exports/" + plotTitle)
        plt.savefig(saveLoc, transparent=True, dpi=300)  # save before show(); show() releases the figure
        plt.show()
        break

    except NodeNotFound:
        break  # nothing changes between attempts, so retrying would loop forever
Warning: Large networks can take several minutes to generate. Be patient. 

<Figure size 432x288 with 0 Axes>

Network Graph: Fruchterman-Reingold Layout (Distance = 0.75)

In [20]:
print()
print("Warning: Large networks can take several minutes to generate. Be patient. ")
print()
while True:
    try:
        ######################################
        # Set Layout to Fruchterman-Reingold #
        ######################################
        plt.figure(figsize=(20, 20))
        g = nx.from_pandas_edgelist(df, source='origin', target='username')
        layout = nx.fruchterman_reingold_layout(g, iterations=1000, k=0.8)
        
        ################
        # Lonely Nodes #
        ################
        sellerUsernames = list(df.username.unique())        
        lonelyNodes = [review for review in sellerUsernames if g.degree(review) == 0]
        g0 = g.subgraph(lonelyNodes).copy()
        nx.draw_networkx_nodes(g, layout, node_color='blue', node_size=4)
        nx.draw_networkx_edges(g, layout, width=1, edge_color="lightblue")
        
        ###################
        # Popular Sellers #
        ###################
        popularSellers = [review for review in sellerUsernames if g.degree(review) > 1]
        g1 = g.subgraph(popularSellers).copy()
        nx.draw_networkx_nodes(g1, layout, nodelist=popularSellers,
                               node_size=250, node_color='orange')
        nx.draw_networkx_edges(g1, layout, edge_color='orange', width=2)
        nodeLabelsPopularSellers = dict(zip(popularSellers, popularSellers))
        nx.draw_networkx_labels(g1, layout, labels=nodeLabelsPopularSellers, font_size=14,
                                font_color='k', font_family='sans-serif', font_weight='normal')

        #########################
        # Originating Usernames #
        #########################
        sellerOrigin = list(df.origin.unique())
        sellerOriginSize = [g.degree(review) * 10 for review in sellerOrigin]
        g2 = g.subgraph(sellerOrigin).copy()
        nx.draw_networkx_nodes(g2, layout, nodelist=sellerOrigin,
                               node_size=sellerOriginSize, node_color='green')
        nx.draw_networkx_edges(g2, layout, edge_color='lightgreen', width=2)
        nodeLabelsOrigin = dict(zip(sellerOrigin, sellerOrigin))
#         nx.draw_networkx_labels(g2, layout, labels=nodeLabelsOrigin, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')
        #####################
        # Deleted Usernames #
        #####################        
        usernameList = list(df['username'].unique())
        deletedUsers = list([s for s in usernameList if "deleted" in s])
        g3 = g.subgraph(deletedUsers).copy()
        nx.draw_networkx_nodes(g3, layout, nodelist=deletedUsers, node_size=50, node_color='red')
        nx.draw_networkx_edges(g3, layout, edge_color='pink', width=2)
        nodeLabelsDeletedUsers = dict(zip(deletedUsers, deletedUsers))
#         nx.draw_networkx_labels(g3, layout, labels=nodeLabelsDeletedUsers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')

        ########################
        # Display Plot Options #
        ########################
        plotTitle = "eBay Seller Network - Fruchterman Layout II"
        plt.axis('off')
        plt.title(plotTitle)
        saveLoc = ("./flea-exports/" + plotTitle)
        plt.savefig(saveLoc, transparent=True, dpi=300)  # save before show(); show() releases the figure
        plt.show()
        break

    except NodeNotFound:
        break  # nothing changes between attempts, so retrying would loop forever
Warning: Large networks can take several minutes to generate. Be patient. 

<Figure size 432x288 with 0 Axes>

Network Graph: Egocentric Layout

In [21]:
print()
sourceS = input("Enter a username to center graph: ")
print()
print("Warning: Large networks can take several minutes to generate. Please be patient! ")

while True:
    try:
        ########################
        # Set Layout to Spring #
        ########################
        plt.figure(figsize=(20, 20))
        g = nx.from_pandas_edgelist(df, source='origin', target='username')
        gEgo = nx.ego_graph(g,n=sourceS)                              
        g = nx.from_pandas_edgelist(df, source='origin', target='username', create_using=gEgo)
        layout = nx.spring_layout(g,iterations=500, k=0.8)
        
        ################
        # Lonely Nodes #
        ################
        sellerUsernames = list(df.username.unique())        
        lonelyNodes = [review for review in sellerUsernames if g.degree(review) == 0]
        g0 = g.subgraph(lonelyNodes).copy()
        nx.draw_networkx_nodes(g, layout, node_color='blue', node_size=4)
        nx.draw_networkx_edges(g, layout, width=1, edge_color="lightblue")
        # nodeLabelsLonelyNodes = dict(zip(lonelyNodes, lonelyNodes))        
        
        ###################
        # Popular Sellers #
        ###################
        popularSellers = [review for review in sellerUsernames if g.degree(review) > 1]
        g1 = g.subgraph(popularSellers).copy()
        nx.draw_networkx_nodes(g1, layout, nodelist=popularSellers,
                               node_size=250, node_color='orange')
        nx.draw_networkx_edges(g1, layout, edge_color='orange', width=2)
        nodeLabelsPopularSellers = dict(zip(popularSellers, popularSellers))
#         nx.draw_networkx_labels(g1, layout, labels=nodeLabelsPopularSellers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')
        
        #########################
        # Originating Usernames #
        #########################
        sellerOrigin = list(df.origin.unique())
        sellerOriginSize = [g.degree(review) * 10 for review in sellerOrigin]
        g2 = g.subgraph(sellerOrigin).copy()
        nx.draw_networkx_nodes(g2, layout, nodelist=sellerOrigin,
                               node_size=sellerOriginSize, node_color='green')
        nx.draw_networkx_edges(g2, layout, edge_color='lightgreen', width=2)
        nodeLabelsOrigin = dict(zip(sellerOrigin, sellerOrigin))
        nx.draw_networkx_labels(g2, layout, labels=nodeLabelsOrigin, font_size=14,
                                font_color='k', font_family='sans-serif', font_weight='normal')
        
        #####################
        # Deleted Usernames #
        #####################
        usernameList = list(df['username'].unique())
        deletedUsers = list([s for s in usernameList if "deleted" in s])
        g3 = g.subgraph(deletedUsers).copy()
        nx.draw_networkx_nodes(g3, layout, nodelist=deletedUsers, node_size=50, node_color='red')
        nx.draw_networkx_edges(g3, layout, edge_color='pink', width=2)
        nodeLabelsDeletedUsers = dict(zip(deletedUsers, deletedUsers))
#         nx.draw_networkx_labels(g3, layout, labels=nodeLabelsDeletedUsers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')
        
        ########################
        # Display Plot Options #
        ########################
        plotTitle = "eBay Seller Network - Egocentric Layout"
        plt.axis('off')
        plt.title(plotTitle)
        saveLoc = ("./flea-exports/" + plotTitle)
        plt.savefig(saveLoc, transparent=True, dpi=300)  # save before show(); show() releases the figure
        plt.show()
        break
        
    except NodeNotFound:
        sourceS = input("Error. Try a different username: ")
Enter a username to center graph: janedoe

Warning: Large networks can take several minutes to generate. Please be patient! 
<Figure size 432x288 with 0 Axes>
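
The egocentric layout relies on `nx.ego_graph`, which keeps only the nodes within `radius` hops of the chosen center (the default radius is 1). A toy illustration with invented node names:

```python
import networkx as nx

# Toy review graph; "far" is three hops from the center
g = nx.Graph([("janedoe", "time-shack"), ("time-shack", "g1"), ("g1", "far")])

ego1 = nx.ego_graph(g, n="janedoe")            # default radius=1: center + neighbors
ego2 = nx.ego_graph(g, n="janedoe", radius=2)  # expand to 2 hops

print(sorted(ego1.nodes()))  # ['janedoe', 'time-shack']
print(sorted(ego2.nodes()))  # ['g1', 'janedoe', 'time-shack']
```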

Export Interactive HTML Graph

In [5]:
while True:
    try:
        pickle = input("Import Dataframe (Enter to Exit) ")
        if pickle == "":
            break
        else:
            pickleName = ("./flea-exports/" + pickle + ".pkl")
            df = pd.read_pickle(pickleName)  # pandas pickle, so read with pandas (not nx.read_gpickle)
            print("Import complete.")
            print(df.shape)
            display(df.head())
            break
    except FileNotFoundError:
        print()
        print("Error. Try a different file.")
        
sourceS = input("Source Username: ")   
graphnameInput = input("Name your graph to download or view in a new window: ")
g = nx.from_pandas_edgelist(df, source='origin', target='username')
gEgo = nx.ego_graph(g, n=sourceS)
g = nx.from_pandas_edgelist(
    df, source='origin', target='username', create_using=gEgo)

g1 = net.Network(notebook=False, height="90%", width="100%",
                 bgcolor="#222222", font_color="white")
g1.toggle_physics(True)
g1.show_buttons(filter_='physics')
g1.from_nx(g)

# Per-edge weights (computed here but not yet wired into the pyvis export)
sources = list(df.origin.unique())
targets = list(df['username'].unique())
weights = [g.degree(etarget) for etarget in targets]
edge_data = zip(sources, targets, weights)

graphname = ("./flea-exports/" + graphnameInput + ".html")
g1.write_html(graphname, notebook=False)

cwd = os.getcwd()
user = (cwd.split("-"))

print()
print("WARNING: this is a large dataframe. This may take a few minutes depending on how large your network is!")
print()
print("https://tylerseymour.pw/user/" + user[1] + "/view/flea-exports/" + graphnameInput + ".html")
print()
Import Dataframe (Enter to Exit) janedoe_2hop
Import complete.
(2056, 5)
feedback type when origin username
0 Thank U Seller During past 6 months terazwroclaw joecoolcigar
1 Thank U Seller During past 6 months terazwroclaw grafo-15
2 Thank U Seller During past 6 months terazwroclaw grafo-15
3 Thank U Seller During past 6 months terazwroclaw vivanista13
4 Thank U Seller During past 6 months terazwroclaw proozyoutlet
Source Username: janedoe
Name your graph to download or view in a new window: janedoe_2hop

WARNING: this is a large dataframe. This may take a few minutes depending on how large your network is!

http://104.197.20.244/user/demo/view/flea-exports/janedoe_2hop.html

Follow the Money

Shortest Path Between Nodes

In [23]:
print()
print("Enter the source and target usernames to visualize the shortest path: ")

while True:
    try:
        #########################
        # Set Source and Target #
        #########################
        print()
        sourceS = input("Source Username: ")
        targetT = input("Target Username: ")
        print()

        ########################
        # Set Layout to Spring #
        ########################
        plt.figure(figsize=(20, 20))
        g = nx.from_pandas_edgelist(df, source='origin', target='username')
        layout = nx.spring_layout(g,iterations=500, k=0.8)
        
        ################
        # Lonely Nodes #
        ################
        sellerUsernames = list(df.username.unique())        
        lonelyNodes = [review for review in sellerUsernames if g.degree(review) == 0]
        g0 = g.subgraph(lonelyNodes).copy()
        nx.draw_networkx_nodes(g0, layout, node_color='blue', node_size=4)
        nx.draw_networkx_edges(g0, layout, width=1, edge_color="lightblue")
        nodeLabelsLonelyNodes = dict(zip(lonelyNodes, lonelyNodes))
#         nx.draw_networkx_labels(g, layout, labels=nodeLabelsLonelyNodes, font_size=6, font_color='k', font_family='sans-serif', font_weight='normal')

        ###################
        # Popular Sellers #
        ###################
        popularSellers = [review for review in sellerUsernames if g.degree(review) > 1]
        g1 = g.subgraph(popularSellers).copy()
        nx.draw_networkx_nodes(g1, layout, nodelist=popularSellers,
                               node_size=250, node_color='orange')
        nx.draw_networkx_edges(g1, layout, edge_color='orange', width=2)
        nodeLabelsPopularSellers = dict(zip(popularSellers, popularSellers))
        nx.draw_networkx_labels(g1, layout, labels=nodeLabelsPopularSellers, font_size=10, font_color='k', font_family='sans-serif', font_weight='normal')

        #########################
        # Originating Usernames #
        #########################
        sellerOrigin = list(df.origin.unique())
        sellerOriginSize = [g.degree(review) * 10 for review in sellerOrigin]
        g2 = g.subgraph(sellerOrigin).copy()
        nx.draw_networkx_nodes(g2, layout, nodelist=sellerOrigin,
                               node_size=sellerOriginSize, node_color='green')
        nx.draw_networkx_edges(g2, layout, edge_color='lightgreen', width=2)
        nodeLabelsOrigin = dict(zip(sellerOrigin, sellerOrigin))
        nx.draw_networkx_labels(g2, layout, labels=nodeLabelsOrigin, font_size=10, font_color='k', font_family='sans-serif', font_weight='normal')

        #####################
        # Deleted Usernames #
        #####################
        usernameList = list(df['username'].unique())
        deletedUsers = list([s for s in usernameList if "deleted" in s])
        g3 = g.subgraph(deletedUsers).copy()
        nx.draw_networkx_nodes(g3, layout, nodelist=deletedUsers, node_size=100, node_color='pink')
        nx.draw_networkx_edges(g3, layout, edge_color='pink', width=2)
        nodeLabelsDeletedUsers = dict(zip(deletedUsers, deletedUsers))
#         nx.draw_networkx_labels(g3, layout, labels=nodeLabelsDeletedUsers, font_size=14,
#                                 font_color='k', font_family='sans-serif', font_weight='normal')
        
        #######################################
        # Display Shortest Path Between Nodes #
        #######################################
        path = nx.shortest_path(g, source=sourceS, target=targetT, method='dijkstra')
        g9 = g.subgraph(path).copy()
        pathLabels = dict(zip(path, path))
        nx.draw_networkx_nodes(g9, layout, nodelist=path,
                               node_size=1500, node_color='blue')
        nx.draw_networkx_edges(g9, layout, edge_color='red', width=8)
        nx.draw_networkx_labels(g9, layout, labels=pathLabels, font_size=18,
                                font_color='red', font_family='sans-serif', font_weight='bold')
        
        ########################
        # Display Plot Options #
        ########################
        plotTitle = (sourceS + " --> " + targetT + "\nMoney Flow Analysis")
        plt.axis('off')
        plt.title(plotTitle)
        saveLoc = ("./flea-exports/" + plotTitle.replace("\n", " "))  # a newline is not valid in a filename
        plt.savefig(saveLoc, transparent=True, dpi=300)  # save before show(); show() releases the figure
        plt.show()
        break
        
    except NodeNotFound:
        print("Error. Either the source or target is not in the graph. Try different inputs.")
        sourceS = input("New source username: ")
        targetT = input("New target username: ")
Enter the source and target usernames to visualize the shortest path: 

Source Username: janedoe
Target Username: vm-express

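The money-flow path above is an unweighted shortest path: with no edge weights, Dijkstra degenerates to hop counting, so the result is simply the fewest transactions linking the two accounts. A minimal sketch on a toy graph (the intermediary `mule01` is a hypothetical account, invented for illustration):

```python
import networkx as nx

# Toy transaction graph; edges connect reviewer and reviewed user.
# "mule01" is a hypothetical intermediary account.
g = nx.Graph()
g.add_edges_from([
    ("janedoe", "time-shack"),
    ("time-shack", "mule01"),
    ("mule01", "vm-express"),
    ("janedoe", "rnxllc"),      # dead end: no route onward to the target
])

# With no weight attribute, Dijkstra returns the minimum-hop route.
path = nx.shortest_path(g, source="janedoe", target="vm-express", method="dijkstra")
print(" --> ".join(path))  # janedoe --> time-shack --> mule01 --> vm-express
```

Even if the source and target never transacted directly, any chain of reviews between them shows up as a candidate cash-out route.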

Ascii Flea

Screen%20Shot%202019-09-07%20at%201.08.36%20PM.png

Utilities

Import Dataframe

In [24]:
print()
while True:
    try:
        pickle = input("Import Dataframe (Enter to Exit): ")
        if pickle == "":
            break
        else:
            pickleName = ("./flea-exports/" + pickle + ".pkl")
            df = pd.read_pickle(pickleName)  # pandas pickle, not a networkx graph
            print()
            print("Import complete.")
            print(df.shape)
            display(df.head())
            break
    except FileNotFoundError:
        print()
        print("Error. Try a different file.")
Import Dataframe (Enter to Exit): janedoe_1hop

Import complete.
(77, 5)
feedback type when origin username
0 Exactly as described...Thank you! Seller During past year janedoe time-shack
1 Exactly as described, looks brand new. Thank you! Seller During past year janedoe terazwroclaw
2 Exactly as described...Thank you! Seller During past year janedoe timespotstore
3 Exactly as described...Thank you! Seller During past year janedoe dwi-international-8
4 Item exactly as described and shipped quickly.... Seller More than a year ago janedoe rnxllc

Combine Dataframes

In [25]:
dfList = []
dfs = []
dfCombo = pd.DataFrame()

while True:
    try:
        dfName = input("Add a DataFrame (Enter to Exit) ")
        if dfName == "":
            break
        else:
            dfName = ("./flea-exports/" + dfName + ".pkl")
            dfList.append(dfName)
            print(dfList)
    except Exception:
        print("Error, try another name (Enter to Exit) ")

for name in dfList:
    dfAdd = pd.read_pickle(name)  # pandas pickle, not a networkx graph
    dfs.append(dfAdd)

dfCombo = pd.concat(dfs)
dfCombo.reset_index(inplace=True, drop=True)

comboName = input("Save Combo Frame As:")
comboName = ("./flea-exports/" + comboName + ".pkl")
dfCombo.to_pickle(comboName)

print(dfCombo.shape)
display(dfCombo.head())
Add a DataFrame (Enter to Exit) janedoe_1hop
['./flea-exports/janedoe_1hop.pkl']
Add a DataFrame (Enter to Exit) alpha913_1hop
['./flea-exports/janedoe_1hop.pkl', './flea-exports/alpha913_1hop.pkl']
Add a DataFrame (Enter to Exit) 
Save Combo Frame As:janedoe_alpha913
(130, 5)
feedback type when origin username
0 Exactly as described...Thank you! Seller During past year janedoe time-shack
1 Exactly as described, looks brand new. Thank you! Seller During past year janedoe terazwroclaw
2 Exactly as described...Thank you! Seller During past year janedoe timespotstore
3 Exactly as described...Thank you! Seller During past year janedoe dwi-international-8
4 Item exactly as described and shipped quickly.... Seller More than a year ago janedoe rnxllc
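The cell above reduces to a `pd.concat` over the loaded frames followed by an index reset. A self-contained sketch with in-memory frames standing in for the pickled 1-hop exports (the rows are hypothetical):

```python
import pandas as pd

# Two small frames standing in for pickled 1-hop exports (hypothetical rows).
df_a = pd.DataFrame({"origin": ["janedoe", "janedoe"],
                     "username": ["time-shack", "rnxllc"]})
df_b = pd.DataFrame({"origin": ["alpha913"],
                     "username": ["time-shack"]})

# Stack the frames, then rebuild a clean 0..n-1 index as the cell does;
# without the reset, the row labels from each source frame would collide.
dfCombo = pd.concat([df_a, df_b])
dfCombo.reset_index(inplace=True, drop=True)
print(dfCombo.shape)  # (3, 2)
```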

Dictionary of Usernames and Pages

In [26]:
print()
while True:
    try:
        pickle = input("Import Dataframe (Enter to Exit): ")
        if pickle == "":
            break
        else:
            pickleName = ("./flea-exports/" + pickle + ".pkl")
            df = pd.read_pickle(pickleName)  # pandas pickle, not a networkx graph
            print()
            print("Import complete.")
            print(df.shape)
            display(df.head())
            break
    except FileNotFoundError:
        print()
        print("Error. Try a different file.")
print()
usernameList = list(df.username.unique())
usernameDict = {}
count = 0
for username in usernameList:
    try:
        html = ('https://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback2&ftab=FeedbackLeftForOthers&userid=' +
                username + '&iid=-1&de=off&items=200&searchInterval=30&mPg=2&page=1')
        tempdf = pd.read_html(html, header=0)
        feedbackCountdf = tempdf[14].copy(deep=False)
        fcount = feedbackCountdf.columns[0]  # .get_values() is deprecated; indexing is equivalent
        flist = fcount.split(' ')
        flist[0] = flist[0].replace(',', '')
        numReviews = int(flist[0])
        pages = (numReviews // 200) + 2
        usernameDict[username] = pages
        count = count + 1
        print("Success\t" + username + " | " + str(numReviews) +
              " Reviews" + " | " + str(count) + "/" + str(len(usernameList)))

    except (IndexError, ValueError):
        count = count + 1
        print("Skipped\t" + username + " | " +
              str(count) + "/" + str(len(usernameList)))

print()
print("Complete.")
Import Dataframe (Enter to Exit): janedoe_alpha913

Import complete.
(130, 5)
feedback type when origin username
0 Exactly as described...Thank you! Seller During past year janedoe time-shack
1 Exactly as described, looks brand new. Thank you! Seller During past year janedoe terazwroclaw
2 Exactly as described...Thank you! Seller During past year janedoe timespotstore
3 Exactly as described...Thank you! Seller During past year janedoe dwi-international-8
4 Item exactly as described and shipped quickly.... Seller More than a year ago janedoe rnxllc
Success	time-shack | 13150 Reviews | 1/117
Success	terazwroclaw | 321 Reviews | 2/117
Success	timespotstore | 120 Reviews | 3/117
Success	dwi-international-8 | 51648 Reviews | 4/117
Success	rnxllc | 547 Reviews | 5/117
Success	ourcedarclosets | 13007 Reviews | 6/117
Success	listen2myreview | 27 Reviews | 7/117
Success	rosesboutiqueno2 | 14636 Reviews | 8/117
Success	sunsetsuppliers | 597 Reviews | 9/117
Success	rik-rew | 11770 Reviews | 10/117
Success	poolsupplyworld | 338238 Reviews | 11/117
Success	ivshc | 42320 Reviews | 12/117
Success	carter-cam14 | 852 Reviews | 13/117
Success	dreamjoey | 9460 Reviews | 14/117
Success	trucestiles | 3480 Reviews | 15/117
Success	davidv556 | 9449 Reviews | 16/117
Success	easyshopca | 84067 Reviews | 17/117
Success	sunglass_super_zone | 19620 Reviews | 18/117
Skipped	705288808@deleted | 19/117
Success	joemutt | 9427 Reviews | 20/117
Success	eladn | 48 Reviews | 21/117
Success	rterryky | 4478 Reviews | 22/117
Success	thebestcover | 23365 Reviews | 23/117
Success	valuesmith | 127152 Reviews | 24/117
Success	chicagocards | 43448 Reviews | 25/117
Success	slqmaven | 180555 Reviews | 26/117
Success	hi-autopia | 544993 Reviews | 27/117
Success	gps-r-us | 14640 Reviews | 28/117
Success	golfer60504 | 13461 Reviews | 29/117
Success	jkw4golf | 989 Reviews | 30/117
Success	martini721 | 10257 Reviews | 31/117
Success	(smashing-pumpkins) | 615 Reviews | 32/117
Success	weekendtreasuregirl | 1771 Reviews | 33/117
Success	percefullr | 41438 Reviews | 34/117
Success	1700gr | 3440 Reviews | 35/117
Success	ljs9121 | 108 Reviews | 36/117
Success	jbezazian05 | 921 Reviews | 37/117
Success	threewands | 318 Reviews | 38/117
Success	golfpro777 | 2462 Reviews | 39/117
Success	golfgearandmore | 6501 Reviews | 40/117
Skipped	suntechcases | 41/117
Success	tnm3996 | 8788 Reviews | 42/117
Success	bestdeals123abc | 148 Reviews | 43/117
Success	jaycsgolf | 61690 Reviews | 44/117
Success	showtime57 | 2048 Reviews | 45/117
Success	niftiee | 292 Reviews | 46/117
Success	lindav1732 | 5261 Reviews | 47/117
Success	thetreasurejunkiemn | 965 Reviews | 48/117
Success	ceemor712 | 12376 Reviews | 49/117
Success	thegolfgodz | 2674 Reviews | 50/117
Success	bigdshouseofgolf | 19941 Reviews | 51/117
Success	blondeissilly | 377 Reviews | 52/117
Success	cgolf50 | 2314 Reviews | 53/117
Skipped	971464246@deleted | 54/117
Success	alpha913 | 142 Reviews | 55/117
Success	pundsnachos | 87 Reviews | 56/117
Success	mybestbet | 5871 Reviews | 57/117
Skipped	789717430@deleted | 58/117
Success	kgawthrope | 3458 Reviews | 59/117
Skipped	644643290@deleted | 60/117
Skipped	108353277@deleted | 61/117
Success	3ballsgolf | 1469691 Reviews | 62/117
Success	knetgolf | 93122 Reviews | 63/117
Success	orlandoelectronic | 11002 Reviews | 64/117
Success	azwehavealltheritestuff | 462 Reviews | 65/117
Success	quickshipgolf | 377707 Reviews | 66/117
Success	anthonyalba_44 | 2 Reviews | 67/117
Success	myfotoguy | 144 Reviews | 68/117
Success	kmd39ald | 1504 Reviews | 69/117
Success	drury491 | 553 Reviews | 70/117
Success	abby0214909 | 0 Reviews | 71/117
Success	mortoyz1 | 1278 Reviews | 72/117
Success	makespace07 | 12930 Reviews | 73/117
Success	jpram1 | 2187 Reviews | 74/117
Success	cbonhamlive | 8219 Reviews | 75/117
Success	drewrealty | 1244 Reviews | 76/117
Success	chip_mel | 240 Reviews | 77/117
Skipped	112857046@deleted | 78/117
Success	pjsol | 1400 Reviews | 79/117
Success	bill6792 | 2039 Reviews | 80/117
Skipped	872263108@deleted | 81/117
Success	thegolfcomplex | 670 Reviews | 82/117
Skipped	48329903@deleted | 83/117
Success	mike_i68 | 434 Reviews | 84/117
Skipped	kugolfer | 85/117
Success	luv3angelbabies | 3217 Reviews | 86/117
Success	golfprosfl | 3733 Reviews | 87/117
Success	geox29 | 4387 Reviews | 88/117
Success	patagonia4sell | 4866 Reviews | 89/117
Success	gadjits | 19485 Reviews | 90/117
Success	jr_circle | 16438 Reviews | 91/117
Success	peekaboofun | 22852 Reviews | 92/117
Success	network482 | 110176 Reviews | 93/117
Success	the_kite_parade | 433 Reviews | 94/117
Success	cathy917 | 2554 Reviews | 95/117
Success	ejbantz | 13528 Reviews | 96/117
Success	dogwalkgolf | 1235 Reviews | 97/117
Success	give_it_to_mikey | 748 Reviews | 98/117
Success	sell2you88 | 890 Reviews | 99/117
Success	forsoso | 4739 Reviews | 100/117
Success	sonny7104 | 715 Reviews | 101/117
Skipped	batgirl4699 | 102/117
Skipped	35403035@deleted | 103/117
Success	cotoblue | 71448 Reviews | 104/117
Success	digital4cheap | 70872 Reviews | 105/117
Success	starsdeals | 3560 Reviews | 106/117
Success	babyearth | 71516 Reviews | 107/117
Skipped	14850468@deleted | 108/117
Success	la_tronics | 226386 Reviews | 109/117
Success	golfclubexchange1 | 718 Reviews | 110/117
Success	ltlau | 17 Reviews | 111/117
Success	golfballsunlimited | 79305 Reviews | 112/117
Skipped	54944838@deleted | 113/117
Success	vintdoc | 7284 Reviews | 114/117
Success	debsatticauctions | 607 Reviews | 115/117
Skipped	60146707@deleted | 116/117
Success	elkhunter777 | 21832 Reviews | 117/117

Complete.
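The per-user arithmetic in the cell above can be isolated into two small helpers: one that parses the feedback-count header with the same split/replace steps, and one that turns a review count into a page budget. The helper names are mine, and the `+ 2` headroom assumes the downstream scraping loop iterates with an exclusive upper bound such as `range(1, pages)`:

```python
def parse_feedback_count(header):
    """Turn a header such as '13,150 Feedback received' into an int,
    using the same split/replace steps as the cell above."""
    first_token = header.split(' ')[0].replace(',', '')
    return int(first_token)

def pages_needed(num_reviews, per_page=200):
    """Floor-divide by the 200 reviews shown per feedback page, plus 2 so
    a range(1, pages)-style loop still reaches the partial last page."""
    return (num_reviews // per_page) + 2

print(parse_feedback_count("13,150 Feedback received"))  # 13150
print(pages_needed(13150))                               # 67
```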

Inputs & Outputs

FleaBay is a versatile tool: it accepts several kinds of input, produces several kinds of output, and ships with a handful of useful utilities.

Input:

    - Single Username
    - Example: janedoe

Output:

    - 1-Hop Graph Visualization (.png); 
    - Exported Graph Data (.pkl). 

Input:

    - Multiple Usernames

Output:

    - 2-Hop Graph Visualization (.png); 
    - Interactive Graph Visualization (.html);
    - Exported Graph Data (.pkl).

Input:

    - Source and Destination Usernames.

Output:

    - Shortest path from Source to Destination Visualization (.png)


Input:

    - List of exported, cleaned DataFrames (.pkl)

Output:

    - Combined (Single) DataFrame (.pkl).